feat: support the OpenAI Responses API (/v1/responses) channel type #91
zhangzhenfei wants to merge 1 commit into ErlichLiu:main
Conversation
Adds an openai-responses provider integrating the new Responses API that OpenAI released in 2025, coexisting with the existing Chat Completions channels without affecting them.

Core changes:
- New ResponsesAdapter (packages/core): builds requests for the /v1/responses endpoint and parses its SSE stream, covering three event types: text deltas, reasoning content, and tool calls
- Request format differences: input replaces messages, instructions replaces the system role, and tool definitions use a flat structure (no nested function object)
- Tool-call continuation: uses function_call + function_call_output input items, preserving call_id via metadata.call_id so it can be matched when sent back
- shared type extensions: ProviderType union, default URL, display name
- channel-manager: three switch statements gain a case 'openai-responses'
- ChannelForm UI: provider options and endpoint preview both updated

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
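The request-format differences above can be sketched as a small conversion step. This is an illustrative shape only, not the PR's actual adapter code: `toResponsesRequest` and the minimal `ChatMessage` type are hypothetical names assumed for the example.

```typescript
// Hypothetical sketch: mapping Chat Completions-style messages into the
// Responses API request shape (input replaces messages, instructions
// replaces the system role, tool results become function_call_output items).

type ChatMessage =
  | { role: "system" | "user" | "assistant"; content: string }
  | { role: "tool"; content: string; tool_call_id: string };

interface ResponsesRequest {
  model: string;
  instructions?: string;                 // replaces the system role
  input: Array<Record<string, unknown>>; // replaces `messages`
}

function toResponsesRequest(model: string, messages: ChatMessage[]): ResponsesRequest {
  const req: ResponsesRequest = { model, input: [] };
  for (const m of messages) {
    if (m.role === "system") {
      // System prompt moves out of the message list into `instructions`.
      req.instructions = m.content;
    } else if (m.role === "tool") {
      // Tool results are continuation input items keyed by call_id.
      req.input.push({
        type: "function_call_output",
        call_id: m.tool_call_id,
        output: m.content,
      });
    } else {
      req.input.push({ role: m.role, content: m.content });
    }
  }
  return req;
}

const req = toResponsesRequest("gpt-4.1", [
  { role: "system", content: "Be brief." },
  { role: "user", content: "Hi" },
]);
console.log(req.instructions); // "Be brief."
console.log(req.input.length); // 1
```

Sending input as an array of items (rather than a bare string) is also what allows function_call and function_call_output entries to be interleaved with ordinary messages on continuation requests.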
JiwaniZakir
left a comment
In channel-manager.ts, openai-responses is added as a fall-through to the existing openai test path (testChannel and testChannelDirect), but that test function always probes the /chat/completions endpoint — not /responses. This means a valid openai-responses channel could fail its connectivity test, while a misconfigured one might pass if the base URL happens to serve Chat Completions. A dedicated test path (or at minimum passing the resolved endpoint URL into the shared test helper) would be more correct here.
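The suggested fix could look roughly like this. `resolveTestEndpoint` is a hypothetical helper, not code from this PR; it just shows the shared test helper receiving a provider-specific endpoint instead of hard-coding /chat/completions:

```typescript
// Hypothetical sketch: resolve the connectivity-test endpoint per provider
// type, so openai-responses channels are probed at the path they actually use.

function resolveTestEndpoint(provider: string, baseUrl: string): string {
  const base = baseUrl.replace(/\/+$/, ""); // normalize trailing slashes
  switch (provider) {
    case "openai-responses":
      return `${base}/responses`;           // probe the Responses endpoint
    case "openai":
    default:
      return `${base}/chat/completions`;    // existing Chat Completions probe
  }
}

console.log(resolveTestEndpoint("openai-responses", "https://api.openai.com/v1/"));
// "https://api.openai.com/v1/responses"
```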
In ChannelForm.tsx, PROVIDER_CHAT_PATHS['openai-responses'] is set to '/responses' (no /v1 prefix), which is consistent with how openai: '/chat/completions' works given that the default base URL already includes /v1. However, if a user sets a custom base URL that doesn't include /v1, they'll silently get the wrong path — worth a comment clarifying the assumption, or a validation note in the UI.
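The validation note suggested above could be as small as the check below. The helper name `warnIfMissingV1` is illustrative, not from the PR; it assumes the documented convention that relative paths like '/responses' are appended to a base URL already ending in /v1:

```typescript
// Hypothetical sketch: warn when a custom base URL lacks the /v1 suffix
// that the relative provider paths assume.

const PROVIDER_CHAT_PATHS: Record<string, string> = {
  openai: "/chat/completions",
  "openai-responses": "/responses", // assumes base URL already ends in /v1
};

function warnIfMissingV1(baseUrl: string): string | null {
  const path = new URL(baseUrl).pathname.replace(/\/+$/, "");
  return path.endsWith("/v1")
    ? null
    : "Base URL does not end in /v1; the resolved endpoint path may be wrong.";
}

console.log(warnIfMissingV1("https://api.openai.com/v1")); // null
```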
The new ResponsesAdapter type definitions look thorough — modeling ResponsesFunctionCallItem and ResponsesFunctionCallOutputItem as distinct input item types rather than overloading the role-based message structure is the right call for this API's format. It would be worth verifying that the id field on ResponsesFunctionCallItem (marked optional) is correctly round-tripped when the API returns it, since omitting it on continuation requests could cause the API to treat it as a new call rather than a resumption.
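The round-trip concern can be made concrete with a small sketch. The interface mirrors the PR's ResponsesFunctionCallItem as described in the review; `echoFunctionCall` is a hypothetical helper showing why the optional `id` must be copied through unchanged:

```typescript
// Hypothetical sketch: echo a function_call item back on a continuation
// request, preserving the optional server-assigned `id` exactly as received.

interface ResponsesFunctionCallItem {
  type: "function_call";
  call_id: string;
  name: string;
  arguments: string;
  id?: string; // server-assigned; must survive the round trip when present
}

function echoFunctionCall(item: ResponsesFunctionCallItem): ResponsesFunctionCallItem {
  // Spreading copies every field, including `id` when the API returned one;
  // rebuilding the object field-by-field risks silently dropping it, which
  // could make the API treat the continuation as a brand-new call.
  return { ...item };
}

const fromApi: ResponsesFunctionCallItem = {
  type: "function_call",
  call_id: "call_abc",
  name: "get_weather",
  arguments: '{"city":"Tokyo"}',
  id: "fc_123",
};
console.log(echoFunctionCall(fromApi).id); // "fc_123"
```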
Summary
- New openai-responses provider type, supporting the OpenAI Responses API (/v1/responses) endpoint
- New ResponsesAdapter handling the Responses API's distinct message format, SSE event parsing, and tool-call continuation
- input is sent as an array (not a string), for compatibility with various relay gateways

Changed files
- packages/core/src/providers/responses-adapter.ts — new Responses API adapter
- packages/core/src/providers/index.ts — registers the openai-responses adapter
- packages/shared/src/types/channel.ts — new openai-responses provider type
- apps/electron/src/main/lib/channel-manager.ts — connectivity-test support for openai-responses channels
- apps/electron/src/renderer/components/settings/ChannelForm.tsx — channel form display support

Test plan
- Create an openai-responses channel and configure its API Key and Base URL

🤖 Generated with Claude Code